Gaussian Mixtures and Tensor Decompositions

Author

  • Cyril Zhang

Abstract

1 Tensors

Tensors are generalizations of vectors v ∈ R^n and matrices M ∈ R^{m×n}. These are respectively 1-tensors and 2-tensors, which we can represent using one- and two-dimensional arrays of real numbers. Although there are more general definitions, for our purposes, a p-th order tensor (or a p-tensor) is an object that can be represented using a p-dimensional array of real numbers. Technically, today by “tensor” we’re strictly referring to covariant Cartesian tensors. We can add tensors of the same order and shape, and multiply them by scalars.
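The representation above can be sketched directly with NumPy arrays, which is one standard way to work with p-tensors numerically (a minimal illustration, not part of the original notes):

```python
import numpy as np

# A 1-tensor (vector) in R^3 and a 2-tensor (matrix) in R^{2x3},
# represented as one- and two-dimensional arrays of real numbers.
v = np.array([1.0, 2.0, 3.0])
M = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])

# A 3-tensor: a 2x2x2 array (p = 3, so a three-dimensional array).
T = np.arange(8, dtype=float).reshape(2, 2, 2)

# Tensors of the same order and shape form a vector space:
# addition and scalar multiplication act entrywise.
S = 2.0 * T + T
assert np.allclose(S, 3.0 * T)
```

Here the order p of the tensor is just the number of array axes (`T.ndim`), and "same order and shape" corresponds to the arrays having equal `.shape`.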


Similar resources

Learning Mixtures of Discrete Product Distributions using Spectral Decompositions

We study the problem of learning a distribution from samples, when the underlying distribution is a mixture of product distributions over discrete domains. This problem is motivated by several practical applications such as crowdsourcing, recommendation systems, and learning Boolean functions. The existing solutions either heavily rely on the fact that the number of mixtures is finite or have s...


Blind identification and source separation in 2×3 under-determined mixtures

Under-determined mixtures are characterized by the fact that they have more inputs than outputs, or, with the antenna array processing terminology, more sources than sensors. The problem addressed is that of identifying and inverting the mixture, which obviously does not admit a linear inverse. Identification is carried out with the help of tensor canonical decompositions. On the other hand, th...


The More, the Merrier: the Blessing of Dimensionality for Learning Large Gaussian Mixtures

In this paper we show that very large mixtures of Gaussians are efficiently learnable in high dimension. More precisely, we prove that a mixture with known identical covariance matrices whose number of components is a polynomial of any fixed degree in the dimension n is polynomially learnable as long as a certain non-degeneracy condition on the means is satisfied. It turns out that this conditi...


Tensor Decompositions for Learning Latent Variable Models

This work considers a computationally and statistically efficient parameter estimation method for a wide class of latent variable models—including Gaussian mixture models, hidden Markov models, and latent Dirichlet allocation—which exploits a certain tensor structure in their low-order observable moments (typically, of second- and third-order). Specifically, parameter estimation is reduced to the pro...


Scalable Latent Tree Model and its Application to Health Analytics

We present an integrated approach to structure and parameter estimation in latent tree graphical models, where some nodes are hidden. Our approach follows a “divide-and-conquer” strategy, and learns models over small groups of variables (where the grouping is obtained through preprocessing). A global solution is obtained in the end through simple merge steps. Our structure learning procedure in...




Publication date: 2015